Stable Encoding of Finite-State Machines in Discrete-Time Recurrent Neural Nets with Sigmoid Units

Authors

  • Rafael C. Carrasco
  • Mikel L. Forcada
  • M. Ángeles Valdés-Muñoz
  • Ramón P. Ñeco
Abstract

There has been a lot of interest in the use of discrete-time recurrent neural nets (DTRNN) to learn finite-state tasks, with interesting results regarding the induction of simple finite-state machines from input-output strings. Parallel work has studied the computational power of DTRNN in connection with finite-state computation. This article describes a simple strategy to devise stable encodings of finite-state machines in computationally capable discrete-time recurrent neural architectures with sigmoid units and gives a detailed presentation on how this strategy may be applied to encode a general class of finite-state machines in a variety of commonly used first- and second-order recurrent neural networks. Unlike previous work that either imposed some restrictions to state values or used a detailed analysis based on fixed-point attractors, our approach applies to any positive, bounded, strictly growing, continuous activation function and uses simple bounding criteria based on a study of the conditions under which a proposed encoding scheme guarantees that the DTRNN is actually behaving as a finite-state machine.
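As a rough illustration of the kind of encoding such strategies produce, the sketch below builds a second-order sigmoid DTRNN from a DFA transition table. It is a minimal sketch, not the paper's exact construction: the one-hot state representation, the gain value H, and the helper names encode_dfa and run are illustrative assumptions, but the idea of programming weights to +H or -H so that, for sufficiently large H, the sigmoid dynamics stay near the one-hot corners reflects the general approach described in this line of work.

# Minimal sketch (assumed construction, not the authors' exact scheme):
# state q_i is represented by state unit x_i being near 1 while the others
# are near 0; the second-order weight W[i, j, k] is +H when delta(q_j, s_k) = q_i
# and -H otherwise, with a bias of -H/2. For large enough H the network's
# state vector stays close to the one-hot corners and it behaves as the DFA.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def encode_dfa(delta, n_states, n_symbols, H=10.0):
    # delta: dict mapping (state j, symbol k) -> next state i
    W = -H * np.ones((n_states, n_states, n_symbols))
    for (j, k), i in delta.items():
        W[i, j, k] = +H
    return W

def run(W, start_state, string, H=10.0):
    # Simulate the network on a list of symbol indices and return the
    # index of the most active state unit after the last symbol.
    n_states = W.shape[0]
    x = np.zeros(n_states)
    x[start_state] = 1.0                      # one-hot initial state
    for k in string:
        # x_i[t+1] = sigmoid( sum_j W[i, j, k] * x_j[t] - H/2 )
        x = sigmoid(W[:, :, k] @ x - H / 2.0)
    return int(np.argmax(x))

# Example: DFA over {0, 1} whose state tracks the parity of the number of 1s.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
W = encode_dfa(delta, n_states=2, n_symbols=2)
print(run(W, 0, [1, 0, 1]))   # -> 0 (even number of 1s)
print(run(W, 0, [1, 1, 1]))   # -> 1 (odd number of 1s)

In this toy run the state vector stays within a small distance of the one-hot corners at every step, which is the kind of condition the paper's bounding criteria are designed to guarantee for any input length.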


Related articles

Encoding of sequential translators in discrete-time recurrent neural nets

In recent years, there has been a lot of interest in the use of discrete-time recurrent neural nets (DTRNN) to learn finite-state tasks, and in the computational power of DTRNN, particularly in connection with finite-state computation. This paper describes a simple strategy to devise stable encodings of sequential finite-state translators (SFST) in a second-order DTRNN with units having bounded, stri...


Efficient encodings of finite automata in discrete-time recurrent neural networks

A number of researchers have used discrete-time recurrent neural nets (DTRNN) to learn finite-state machines (FSM) from samples of input and output strings; trained DTRNN usually show FSM behaviour for strings up to a certain length, but not beyond; this is usually called instability. Other authors have shown that DTRNN may actually behave as FSM for strings of any length and have devised strate...


Asynchronous translations with recurrent neural nets

In recent years, many researchers have explored the relation between discrete-time recurrent neural networks (DTRNN) and finite-state machines (FSMs) either by showing their computational equivalence or by training them to perform as finite-state recognizers from examples. Most of this work has focussed on the simplest class of deterministic state machines, that is deterministic finite automata...


Stable encoding of large finite-state automata in recurrent neural networks with sigmoid discriminants

We propose an algorithm for encoding deterministic finite-state automata (DFAs) in second-order recurrent neural networks with sigmoidal discriminant function and we prove that the languages accepted by the constructed network and the DFA are identical. The desired finite-state network dynamics is achieved by programming a small subset of all weights. A worst case analysis reveals a relationshi...


Optimal Finite-time Control of Positive Linear Discrete-time Systems

This paper considers solving an optimization problem for linear discrete-time systems such that the closed-loop discrete-time system is positive (i.e., all of its state variables have non-negative values) and also finite-time stable. For this purpose, by considering a quadratic cost function, an optimal controller is designed such that in addition to minimizing the cost function, the positivity proper...




Journal:
  • Neural Computation

Volume: 12  Issue: 9

Pages: -

Published: 2000